Sensitivity to hyperprior parameters in Gaussian Bayesian networks

Authors

  • Miguel Ángel Gómez-Villegas
  • Paloma Main
  • Hilario Navarro
  • Rosario Susi
Abstract

Bayesian networks (BNs) have become an essential tool for reasoning under uncertainty in complex models. In particular, the subclass of Gaussian Bayesian networks (GBNs) can be used to model continuous variables with Gaussian distributions. Here we focus on the task of learning GBNs from data. Factorizing the multivariate Gaussian joint density according to a directed acyclic graph (DAG) provides an alternative, interchangeable representation of a GBN in terms of the univariate Gaussian conditional density of each variable given its parents in the DAG. With this conditional specification of a GBN, the learning process involves determining the mean vector, the regression coefficients and the conditional variance parameters. Several approaches have been proposed to learn these parameters from a Bayesian perspective using different priors, each requiring certain hyperparameter values to be tuned. Our goal is to work with the usual prior distributions of normal/inverse gamma form and to evaluate the effect of the prior hyperparameter choice on the posterior distribution. As usual in Bayesian robustness, a large class of priors expressed by many hyperparameter values should lead to a small collection of posteriors. From this perspective, and using the Kullback–Leibler divergence to measure prior and posterior deviations, a local sensitivity measure is proposed to make these comparisons. Since robust Bayesian analysis studies the sensitivity of Bayesian answers to uncertain inputs, this method is also useful for selecting robust hyperparameter values. © 2013 Elsevier Inc. All rights reserved.
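
To make the conditional specification concrete, below is the standard form of a GBN's DAG factorization together with the Kullback–Leibler divergence used for the sensitivity comparisons; the notation (μ_i for marginal means, β_ji for regression coefficients, v_i for conditional variances) is generic textbook notation and not necessarily the paper's own.

```latex
% Factorization of the joint Gaussian density along the DAG:
f(y_1,\dots,y_n) = \prod_{i=1}^{n} f\!\left(y_i \mid \mathrm{pa}(y_i)\right),
\qquad
Y_i \mid \mathrm{pa}(y_i) \sim N\!\left(\mu_i + \sum_{j \in \mathrm{pa}(i)} \beta_{ji}\,(y_j - \mu_j),\; v_i\right).

% Kullback–Leibler divergence between two densities p and q, used to
% compare prior-to-prior and posterior-to-posterior deviations:
\mathrm{KL}(p \parallel q) = \int p(x) \log \frac{p(x)}{q(x)}\, dx .
```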

Similar articles

Gaussian Processes for Bayesian Classification via Hybrid Monte Carlo

The full Bayesian method for applying neural networks to a prediction problem is to set up the prior/hyperprior structure for the net and then perform the necessary integrals. However, these integrals are not tractable analytically, and Markov chain Monte Carlo (MCMC) methods are slow, especially if the parameter space is high-dimensional. Using Gaussian processes we can approximate the weight...
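
As a concrete illustration of the Gaussian-process alternative this snippet describes, here is a minimal numpy sketch that draws function samples from a GP prior with a squared-exponential kernel; the kernel choice and its hyperparameters are illustrative assumptions, not the referenced paper's setup.

```python
import numpy as np

def rbf_kernel(x1, x2, length_scale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance between two sets of 1-D inputs."""
    sq_dists = (x1[:, None] - x2[None, :]) ** 2
    return variance * np.exp(-0.5 * sq_dists / length_scale ** 2)

# Draw sample functions from a zero-mean GP prior on a 1-D grid.
x = np.linspace(-3.0, 3.0, 100)
K = rbf_kernel(x, x) + 1e-8 * np.eye(len(x))  # jitter for numerical stability
rng = np.random.default_rng(0)
samples = rng.multivariate_normal(np.zeros(len(x)), K, size=3)
print(samples.shape)  # (3, 100): three sampled functions on the grid
```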

Hierarchical Variational Models (Appendix)

Relationship to empirical Bayes and RL. The augmentation with a variational prior has strong ties to empirical Bayesian methods, which use data to estimate the hyperparameters of a prior distribution (Robbins, 1964; Efron & Morris, 1973). In general, empirical Bayes considers the fully Bayesian treatment of a hyperprior on the original prior (here, the variational prior on the original mean-field) and...
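
To illustrate the empirical-Bayes idea of estimating a prior's hyperparameters from data, here is a minimal sketch in a normal-normal model; the model and moment-matching estimator are standard illustrative assumptions, not taken from the referenced appendix.

```python
import numpy as np

# Normal-normal model: x_i ~ N(theta_i, 1) with prior theta_i ~ N(0, tau2).
# Marginally x_i ~ N(0, 1 + tau2), so tau2 can be estimated from the data.
rng = np.random.default_rng(1)
tau2_true = 4.0
theta = rng.normal(0.0, np.sqrt(tau2_true), size=500)
x = rng.normal(theta, 1.0)

# Empirical-Bayes estimate of the hyperparameter tau2 via moment matching.
tau2_hat = max(np.mean(x ** 2) - 1.0, 0.0)

# Plug-in posterior means shrink each observation toward the prior mean 0.
shrinkage = tau2_hat / (1.0 + tau2_hat)
theta_post_mean = shrinkage * x
print(f"tau2_hat = {tau2_hat:.2f}, shrinkage factor = {shrinkage:.2f}")
```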

Sensitivity of Gaussian Bayesian networks to inaccuracies in their parameters

To determine the effect of a set of inaccurate parameters in Gaussian Bayesian networks, it is necessary to study the sensitivity of the model. With this aim, we propose a sensitivity analysis based on comparing two different models: the original model with the initial parameters assigned to the Gaussian Bayesian network, and the perturbed model obtained after perturbing a set of inaccurate param...
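
One natural way to implement the model comparison this snippet describes is the closed-form Kullback–Leibler divergence between the joint Gaussian distributions induced by the original and perturbed networks; the sketch below uses the standard multivariate-normal KL formula, and the example perturbation is purely illustrative.

```python
import numpy as np

def kl_gaussians(mu0, sigma0, mu1, sigma1):
    """KL(N(mu0, sigma0) || N(mu1, sigma1)) via the standard closed form."""
    k = len(mu0)
    sigma1_inv = np.linalg.inv(sigma1)
    diff = mu1 - mu0
    return 0.5 * (
        np.trace(sigma1_inv @ sigma0)
        + diff @ sigma1_inv @ diff
        - k
        + np.log(np.linalg.det(sigma1) / np.linalg.det(sigma0))
    )

# Original model vs. a perturbed model (perturbation added to one mean entry).
mu = np.array([0.0, 1.0])
sigma = np.array([[1.0, 0.5], [0.5, 2.0]])
delta = np.array([0.3, 0.0])  # illustrative perturbation
print(kl_gaussians(mu, sigma, mu + delta, sigma))
```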

Sensitivity analysis of extreme inaccuracies in Gaussian Bayesian Networks

We present the behavior of a sensitivity measure defined to evaluate the impact of model inaccuracies on the posterior marginal density of the variable of interest, after evidence propagation, for extreme perturbations of parameters in Gaussian Bayesian networks. This sensitivity measure is based on the Kullback–Leibler divergence and yields different expressions depending on ...
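
Since the measure compares univariate posterior marginals, the relevant closed form is the KL divergence between two univariate normals; the sketch below implements that standard formula, with illustrative values standing in for the original and extremely perturbed posterior parameters.

```python
import numpy as np

def kl_univariate_normal(mu_p, var_p, mu_q, var_q):
    """KL(N(mu_p, var_p) || N(mu_q, var_q)) for univariate Gaussians."""
    return 0.5 * (np.log(var_q / var_p)
                  + (var_p + (mu_p - mu_q) ** 2) / var_q
                  - 1.0)

# Posterior marginal of the variable of interest under the original model
# vs. under an extremely perturbed model (values are illustrative).
print(kl_univariate_normal(0.0, 1.0, 0.0, 100.0))  # variance blown up
print(kl_univariate_normal(0.0, 1.0, 5.0, 1.0))    # mean shifted
```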

Journal:
  • J. Multivariate Analysis

Volume 124, Issue -

Pages -

Publication date: 2014